Stein variational gradient descent with local approximations

Authors

Abstract

Bayesian computation plays an important role in modern machine learning and statistics for reasoning about uncertainty. A key computational challenge in Bayesian inference is to develop efficient techniques to approximate, or draw samples from, posterior distributions. Stein variational gradient descent (SVGD) has been shown to be a powerful approximate inference algorithm for this task. However, the vanilla SVGD requires calculating the gradient of the target density and cannot be applied when the gradient is unavailable or too expensive to evaluate. In this paper we explore one way to address this challenge by constructing a local surrogate for the target distribution, from which the gradient can be obtained in a much more computationally feasible manner. More specifically, we approximate the forward model with a deep neural network (DNN) trained on a carefully chosen training set, which also determines the quality of the surrogate. To this end, we propose a general adaptation procedure to refine the local approximation online without destroying the convergence of the resulting SVGD. This approach significantly reduces the computational cost and leads to a suite of algorithms that are straightforward to implement. The new algorithms are illustrated on a set of challenging inverse problems, and numerical experiments demonstrate a clear improvement in performance and applicability over standard SVGD.
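To make the setting concrete, here is a minimal sketch of the vanilla SVGD update that the abstract refers to (Liu & Wang, 2016), using an RBF kernel and a toy Gaussian target. The fixed bandwidth `h`, step size, and particle count are illustrative choices, not the paper's settings; the paper's contribution is to replace the exact `grad_log_p` below with a DNN surrogate when the true gradient is expensive.

```python
import numpy as np

def svgd_step(X, grad_log_p, stepsize=0.5, h=1.0):
    """One SVGD update with the RBF kernel k(x, x') = exp(-||x - x'||^2 / h)."""
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]           # (n, n, d) pairwise differences
    K = np.exp(-np.sum(diff ** 2, axis=-1) / h)    # (n, n) kernel matrix
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i) = sum_j (2/h) K_ij (x_i - x_j),
    # which spreads the particles apart to cover the target.
    repulsion = 2.0 / h * np.einsum("ij,ijd->id", K, diff)
    # Driving term: sum_j k(x_j, x_i) grad log p(x_j), a kernel-smoothed gradient
    # that pulls particles toward high-density regions.
    phi = (K @ grad_log_p(X) + repulsion) / n
    return X + stepsize * phi

# Toy target: standard 2-D Gaussian, so grad log p(x) = -x (cheap and exact here;
# for an expensive forward model this is the quantity a surrogate would supply).
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=1.0, size=(20, 2))   # particles start far from the mode
for _ in range(500):
    X = svgd_step(X, lambda x: -x)
print(np.round(X.mean(axis=0), 2))  # particle mean is driven toward the target mean 0
```

Note that every iteration calls `grad_log_p` at all particle locations, which is exactly why a cheap local surrogate matters when each gradient evaluation involves an expensive forward model.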


Similar articles

Stein Variational Gradient Descent as Gradient Flow

Stein variational gradient descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate given distributions, based on a gradient-based update that guarantees to optimally decrease the KL divergence within a function space. This paper develops the first theoretical analysis on SVGD. We establish that the empirical measures of the SVGD samples...


VAE Learning via Stein Variational Gradient Descent

A new method for learning variational autoencoders (VAEs) is developed, based on Stein variational gradient descent. A key advantage of this approach is that one need not make parametric assumptions about the form of the encoder distribution. Performance is further enhanced by integrating the proposed encoder with importance sampling. Excellent performance is demonstrated across multiple unsupe...


Stein Variational Gradient Descent: Theory and Applications

Although optimization can be done very efficiently using gradient-based optimization these days, Bayesian inference or probabilistic sampling has been considered to be much more difficult. Stein variational gradient descent (SVGD) is a new particle-based inference method derived using a functional gradient descent for minimizing KL divergence without explicit parametric assumptions. SVGD can be...


Learning to Draw Samples with Amortized Stein Variational Gradient Descent

We propose a simple algorithm to train stochastic neural networks to draw samples from given target distributions for probabilistic inference. Our method is based on iteratively adjusting the neural network parameters so that the output changes along a Stein variational gradient direction (Liu & Wang, 2016) that maximally decreases the KL divergence with the target distribution. Our method work...



Journal

Journal title: Computer Methods in Applied Mechanics and Engineering

Year: 2021

ISSN: 0045-7825, 1879-2138

DOI: https://doi.org/10.1016/j.cma.2021.114087